
    An Overview of Progress and Problems in Educational Technology

    Educational technologists have promised that great advances and improvements in learning and instruction would occur on account of new and emerging technologies. Some of these promises have been partially fulfilled, but many have not. The last decade of the previous century witnessed the consolidation of new approaches to learning and instruction under the banner of constructivism. This so-called new learning paradigm was really not all that new, but renewed emphasis on learners and learning effectiveness can clearly be counted as gains resulting from this constructivist consolidation within educational research. At the same time, technology was not standing still. Network technologies were increasing bandwidth, software engineering was embracing object orientation, and wireless technologies were extending accessibility. It is clear that we can now do things to improve education that were not possible twenty years ago. However, the potential gains in learning and instruction have yet to be realized on a significant global scale. Why not? Critical challenges confront instructional designers, and critical problems remain with regard to learning in and about complex domains. Moreover, the organizational issues required to translate advances in learning theory and educational technology into meaningful practice have yet to be addressed. The current situation in the field of educational technology is one of technification: new educational technologies are usable only by a scarce cadre of technocrats. Constructivist approaches to learning have been oversimplified to such a degree that learning effectiveness has lost meaning. As a consequence, education is generally managed in an ad hoc manner that marginalizes the potential gains offered by new learning technologies. This paper presents an overview of progress and problems in educational technology and argues that educational program management must be integrally linked with technology and theory in order for significant progress in learning and instruction to occur on a global scale.

    System dynamics advances strategic economic transition planning in a developing nation

    The increasingly complex environment of today's world, characterized by technological innovation and global communication, generates a myriad of possible and actual interactions, while limited physical and intellectual resources severely constrain decision makers in both the public and private domains. At the core of the decision-making process is the need for quality information that allows the decision maker to better assess the impact of decisions in terms of outcomes, nonlinear feedback processes, and time delays on the performance of the complex system involved. This volume is a timely review of the principles underlying complex decision making, the handling of uncertainties in dynamic environments, and the various modeling approaches used. The book consists of five parts, each composed of several chapters: I: Complex Decision Making: Concepts, Theories and Empirical Evidence; II: Tools and Techniques for Decision Making in Complex Environments and Systems; III: System Dynamics and Agent-Based Modeling; IV: Methodological Issues; V: Future Directions.

    Designing on-demand education for simultaneous development of domain-specific and self-directed learning skills

    On-demand education enables individual learners to choose their learning pathways according to their own learning needs. They must use self-directed learning (SDL) skills, involving self-assessment and task selection, to determine appropriate pathways for learning. Learners who lack these skills must develop them, because SDL skills are prerequisite to developing domain-specific skills. This article describes the design of an on-demand learning environment developed to enable novices to simultaneously develop their SDL and domain-specific skills. Learners received advice on their self-assessments and their selections of subsequent learning tasks. In the domain of system dynamics – a way to model a dynamic system and draw graphs depicting the system's behaviour over time – advice on self-assessment is provided in a scoring rubric containing relevant performance standards. Advice on task selection indicates all relevant task aspects to be taken into account, including recommendations for suitable learning tasks that meet the individual learner's needs. This article discusses the design of the environment and the learners' perceptions of its usefulness. Most of the time, the learners found the advice appropriate, and they followed it in 78% of their task selections.
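As a minimal illustration of the kind of model meant by "system dynamics" above, the sketch below steps a single stock with a constant inflow and a level-dependent outflow forward in time. The tank metaphor, names, and parameter values are illustrative assumptions, not material from the article.

```python
# Minimal system dynamics sketch: one stock (the water level in a tank),
# a constant inflow, and an outflow proportional to the level, integrated
# with simple Euler steps. All names and numbers are illustrative.

def simulate(level=0.0, inflow=5.0, drain_rate=0.1, dt=1.0, steps=50):
    """Return the stock's trajectory (its behaviour over time)."""
    history = [level]
    for _ in range(steps):
        outflow = drain_rate * level      # negative feedback: outflow grows with the stock
        level += (inflow - outflow) * dt  # Euler step: stock += net flow * dt
        history.append(level)
    return history

trajectory = simulate()
# The level rises and flattens toward the equilibrium inflow / drain_rate = 50,
# the goal-seeking behaviour-over-time graph typical of negative feedback.
```

Plotting `trajectory` against time yields exactly the sort of behaviour-over-time graph the abstract describes learners drawing.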

    The effect of FTO variation on increased osteoarthritis risk is mediated through body mass index: a Mendelian randomisation study

    Objective: Variation in the fat mass and obesity-associated (FTO) gene influences susceptibility to obesity. A variant in the FTO gene has been implicated in genetic risk of osteoarthritis (OA). We examined the role of the FTO polymorphism rs8044769 in risk of knee and hip OA in cases and controls, incorporating body mass index (BMI) information. Methods: 5409 knee OA patients, 4355 hip OA patients and up to 5362 healthy controls from 7 independent cohorts from the UK and Australia were genotyped for rs8044769. The association of the FTO variant with OA was investigated in case/control analyses with and without BMI adjustment, and in analyses matched for BMI category. A Mendelian randomisation approach was employed using the FTO variant as the instrumental variable to evaluate the role of overweight in OA. Results: In the meta-analysis of all overweight (BMI ≥ 25) samples versus normal-weight controls, irrespective of OA status, the association of rs8044769 with overweight is highly significant (OR [CIs] for allele G = 1.14 [1.08 to 1.19], p = 7.5×10⁻⁷). A significant association with knee OA is present in the analysis without BMI adjustment (OR [CIs] = 1.08 [1.02 to 1.14], p = 0.009), but the signal fully attenuates after BMI adjustment (OR [CIs] = 0.99 [0.93 to 1.05], p = 0.666). We observe no evidence of association in the BMI-matched meta-analyses. Using Mendelian randomisation approaches we confirm the causal role of overweight in OA. Conclusions: Our data highlight the contribution of genetic risk to overweight in defining risk of OA, but the association is exclusively mediated by the effect on BMI. This is consistent with what is known of the biology of the FTO gene and supports the causative role of high BMI in OA.
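For readers unfamiliar with the technique, the core of a single-instrument Mendelian randomisation analysis can be sketched as a Wald ratio: the variant's effect on the outcome divided by its effect on the exposure. The effect sizes below are illustrative placeholders, not the study's estimates.

```python
import math

def wald_ratio(beta_outcome, beta_exposure):
    """Causal effect of the exposure on the outcome, using one genetic instrument."""
    return beta_outcome / beta_exposure

# Illustrative per-allele effects (NOT the paper's numbers):
beta_snp_oa = math.log(1.08)   # variant's effect on log-odds of knee OA
beta_snp_bmi = 0.35            # variant's effect on BMI, in kg/m^2

effect = wald_ratio(beta_snp_oa, beta_snp_bmi)  # log-odds of OA per unit of BMI
or_per_bmi_unit = math.exp(effect)              # odds ratio of OA per unit of BMI
```

In practice the ratio is reported with a standard error (e.g. via the delta method), and estimates from several instruments are combined by inverse-variance weighting.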

    Attitudes and Performance: An Analysis of Russian Workers

    Full text link
    This paper investigates the relationship between locus of control and performance among Russian employees, using survey data collected at 28 workplaces in 2002 in Taganrog and at 47 workplaces in 2003 in Ekaterinburg. We develop a measure that allows us to categorize the Russian employees participating in our survey as exhibiting an internal or external locus of control. We then assess the extent to which there are significant differences between “internals” and “externals” in work-related attitudes that may affect performance. In particular, we focus on (1) attitudes about outcomes associated with hard work, (2) level of job satisfaction, (3) expectation of receiving a desired reward, and (4) loyalty to and involvement with one’s organization. In each case we identify where gender and generational differences emerge. Our main objective is to determine whether Russian employees who exhibit an internal locus of control perform better than employees with an external locus of control. Our performance measures include earnings, expected promotions, and assessments of the quantity and quality of work in comparison to others at the same organization doing a similar job. Controlling for a variety of worker characteristics, we find that (1) individuals who exhibit an internal locus of control perform better, but this result is not always statistically significant; (2) even among “internals,” women earn significantly less than men and have a much lower expectation of promotion; (3) even among “internals,” experience with unemployment has a negative influence on performance.

    Antarctic ice sheet paleo-constraint database

    We present a database of observational constraints on past Antarctic Ice Sheet changes during the last glacial cycle, intended to consolidate the observations that represent our understanding of past Antarctic changes for state-space estimation and paleo-model calibration. The database is a major expansion of the initial work of Briggs and Tarasov (2013). It includes new data types and multi-tier data quality assessment. The updated constraint database “AntICE2” consists of observations of past grounded and floating ice sheet extent, past ice thickness, past relative sea level, borehole temperature profiles, and present-day bedrock displacement rates. In addition to paleo-observations, the present-day ice sheet geometry and surface ice velocities are incorporated to constrain the present-day ice sheet configuration. The method by which the data are curated using explicitly defined criteria is detailed, and the observational uncertainties are specified. The methodology by which the constraint database can be applied to evaluate a given ice sheet reconstruction is discussed. The implementation of the “AntICE2” database for Antarctic Ice Sheet model calibrations will improve Antarctic Ice Sheet predictions during past warm and cold periods and yield more robust paleo-model spin-ups for forecasting future ice sheet changes.

    Imputation of variants from the 1000 Genomes Project modestly improves known associations and can identify low-frequency variant-phenotype associations undetected by HapMap based imputation

    PMCID: PMC3655956
    Genome-wide association (GWA) studies have been limited by the reliance on common variants present on microarrays or imputable from the HapMap Project data. More recently, the completion of the 1000 Genomes Project has provided variant and haplotype information for several million variants derived from sequencing over 1,000 individuals. To help understand the extent to which more variants (including low-frequency (1% ≤ MAF < 5%) and rare (MAF < 1%) variants) can enhance previously identified associations and identify novel loci, we selected 93 quantitative circulating factors for which data were available from the InCHIANTI population study. These phenotypes included cytokines, binding proteins, hormones, vitamins and ions. We selected these phenotypes because many have known strong genetic associations and are potentially important for understanding disease processes. We performed a genome-wide scan for these 93 phenotypes in InCHIANTI. We identified 21 signals and 33 signals that reached P < 5×10⁻⁸ based on HapMap and 1000 Genomes imputation, respectively, and 9 and 11 that reached a stricter, likely conservative, threshold of P < 5×10⁻¹¹, respectively. Imputation of 1000 Genomes genotype data modestly improved the strength of known associations. Of 20 associations detected at P < 5×10⁻⁸ in both analyses (17 of which represent well-replicated signals in the NHGRI catalogue), six were captured by the same index SNP, five were nominally more strongly associated in 1000 Genomes imputed data, and one was nominally more strongly associated in HapMap imputed data. We also detected an association between a low-frequency variant and a phenotype that was previously missed by HapMap-based imputation approaches. An association between rs112635299 and alpha-1 globulin near the SERPINA gene represented the known association between rs28929474 (MAF = 0.007) and alpha1-antitrypsin that predisposes to emphysema (P = 2.5×10⁻¹²). Our data provide important proof of principle that 1000 Genomes imputation will detect novel, low-frequency, large-effect associations.
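The allele-frequency bands used in this abstract (rare, low-frequency, common) can be made concrete with a small helper whose thresholds follow the definitions given above; the function name is our own, not from the paper.

```python
def frequency_band(maf):
    """Classify a variant by minor allele frequency (MAF), per the bands above."""
    if maf < 0.01:
        return "rare"           # MAF < 1%
    if maf < 0.05:
        return "low-frequency"  # 1% <= MAF < 5%
    return "common"             # MAF >= 5%

band = frequency_band(0.03)  # -> "low-frequency"
```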